Will Knight
MIT Technology Review
Intel is cutting 12,000 workers as it faces the financial consequences of underestimating a profound shift in computing from desktop computers to pocket-sized devices.
And more trouble may lie ahead. The rate at which Intel makes technological advances suddenly seems to be slowing, and other looming trends, including artificial intelligence and perhaps virtual reality, look set to benefit a different kind of computer architecture.
The job cuts are a sign that Intel misjudged the speed with which people would abandon desktops in favor of smartphones and tablets, and failed to reposition its product line to ride that revolution. Only last week the research company Gartner reported that PC shipments were down 9.6 percent in the first quarter of the year.
Intel is perhaps also guilty of focusing too heavily on wringing ever more performance out of its chips, when power efficiency is just as important in mobile devices. Intel does have a line of mobile processors, but most mobile devices are based on a rival architecture licensed from a British company called ARM.
The company is now finding that the rate at which it can double the number of transistors on its chips, a cadence dubbed Moore’s Law after Intel’s cofounder Gordon Moore, is slowing down.
And while Intel says it will refocus its attention on cloud computing and devices for the Internet of things, it risks missing out on several up-and-coming opportunities. Artificial intelligence and virtual reality are already feeding demand for a completely different kind of chip architecture.
Last week, I spent a few days at a developer conference in San Jose organized by Nvidia, a chip company that makes graphics processing units, or GPUs. These chips are especially good at the kind of parallel computation companies are harnessing to perform deep learning (a powerful form of machine learning), and they are of course geared toward rendering the highly realistic 3-D environments needed for virtual reality. Indeed, the Nvidia event was filled with demos of self-driving cars, deep-learning systems, and virtual-reality headsets.
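To make that contrast concrete, here is a minimal CUDA sketch of the data-parallel style GPUs excel at. It is my own illustration rather than anything shown at the Nvidia event, and the kernel name scaleAdd is hypothetical: each GPU thread handles a single array element, the same pattern that underlies the huge matrix operations at the heart of deep learning.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread computes one element of out = a*x + y. On a GPU,
// thousands of these threads run at the same time -- the kind of
// parallelism deep-learning workloads are built on.
__global__ void scaleAdd(const float *x, const float *y, float *out,
                         float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a * x[i] + y[i];  // one multiply-add per thread
}

int main() {
    const int n = 1 << 20;  // about a million elements
    size_t bytes = n * sizeof(float);
    float *x, *y, *out;
    // Unified memory keeps the example short; real frameworks
    // manage device buffers explicitly.
    cudaMallocManaged(&x, bytes);
    cudaMallocManaged(&y, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover n
    scaleAdd<<<blocks, threads>>>(x, y, out, 3.0f, n);
    cudaDeviceSynchronize();  // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]);  // expect 5.0
    cudaFree(x); cudaFree(y); cudaFree(out);
    return 0;
}
```

A conventional CPU would loop over those million elements largely one at a time; the GPU dispatches them across thousands of cores at once, which is why this architecture has become the workhorse of deep learning.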
So beyond cutting jobs, Intel might need to think about how it can feed the industry’s appetite for AI and VR if it doesn’t want to miss the next big shift in how we use computers.